Objective:

Given a bank customer, build a neural-network-based classifier that predicts whether the customer will leave the bank within the next 6 months.

  • Context: Service businesses like banks have to worry about the problem of 'churn', i.e. customers leaving to join another service provider. It is important to understand which aspects of the service influence a customer's decision in this regard, so that management can concentrate improvement efforts with these priorities in mind.

https://github.com/GreatLearningAIML1/gl-pgp-aiml-uta-intl-may20-ssetty3.git

In [1]:
import pandas as pd
import numpy as np 

import matplotlib.pyplot as plt 
%matplotlib inline
from mpl_toolkits.mplot3d import Axes3D
import itertools
from scipy.stats import norm 
import seaborn as sns

import warnings
warnings.filterwarnings("ignore")
In [2]:
df = pd.read_csv('bank.csv')
df.head()
Out[2]:
RowNumber CustomerId Surname CreditScore Geography Gender Age Tenure Balance NumOfProducts HasCrCard IsActiveMember EstimatedSalary Exited
0 1 15634602 Hargrave 619 France Female 42 2 0.00 1 1 1 101348.88 1
1 2 15647311 Hill 608 Spain Female 41 1 83807.86 1 0 1 112542.58 0
2 3 15619304 Onio 502 France Female 42 8 159660.80 3 1 0 113931.57 1
3 4 15701354 Boni 699 France Female 39 1 0.00 2 0 0 93826.63 0
4 5 15737888 Mitchell 850 Spain Female 43 2 125510.82 1 1 1 79084.10 0
In [3]:
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 10000 entries, 0 to 9999
Data columns (total 14 columns):
 #   Column           Non-Null Count  Dtype  
---  ------           --------------  -----  
 0   RowNumber        10000 non-null  int64  
 1   CustomerId       10000 non-null  int64  
 2   Surname          10000 non-null  object 
 3   CreditScore      10000 non-null  int64  
 4   Geography        10000 non-null  object 
 5   Gender           10000 non-null  object 
 6   Age              10000 non-null  int64  
 7   Tenure           10000 non-null  int64  
 8   Balance          10000 non-null  float64
 9   NumOfProducts    10000 non-null  int64  
 10  HasCrCard        10000 non-null  int64  
 11  IsActiveMember   10000 non-null  int64  
 12  EstimatedSalary  10000 non-null  float64
 13  Exited           10000 non-null  int64  
dtypes: float64(2), int64(9), object(3)
memory usage: 1.1+ MB
In [4]:
df.describe()
Out[4]:
RowNumber CustomerId CreditScore Age Tenure Balance NumOfProducts HasCrCard IsActiveMember EstimatedSalary Exited
count 10000.00000 1.000000e+04 10000.000000 10000.000000 10000.000000 10000.000000 10000.000000 10000.00000 10000.000000 10000.000000 10000.000000
mean 5000.50000 1.569094e+07 650.528800 38.921800 5.012800 76485.889288 1.530200 0.70550 0.515100 100090.239881 0.203700
std 2886.89568 7.193619e+04 96.653299 10.487806 2.892174 62397.405202 0.581654 0.45584 0.499797 57510.492818 0.402769
min 1.00000 1.556570e+07 350.000000 18.000000 0.000000 0.000000 1.000000 0.00000 0.000000 11.580000 0.000000
25% 2500.75000 1.562853e+07 584.000000 32.000000 3.000000 0.000000 1.000000 0.00000 0.000000 51002.110000 0.000000
50% 5000.50000 1.569074e+07 652.000000 37.000000 5.000000 97198.540000 1.000000 1.00000 1.000000 100193.915000 0.000000
75% 7500.25000 1.575323e+07 718.000000 44.000000 7.000000 127644.240000 2.000000 1.00000 1.000000 149388.247500 0.000000
max 10000.00000 1.581569e+07 850.000000 92.000000 10.000000 250898.090000 4.000000 1.00000 1.000000 199992.480000 1.000000
In [5]:
# check missing values
print ('The missing values are >>> \n', df.isnull().sum())
The missing values are >>> 
 RowNumber          0
CustomerId         0
Surname            0
CreditScore        0
Geography          0
Gender             0
Age                0
Tenure             0
Balance            0
NumOfProducts      0
HasCrCard          0
IsActiveMember     0
EstimatedSalary    0
Exited             0
dtype: int64
In [6]:
# pandas_profiling 
from pandas_profiling import ProfileReport
In [7]:
profile = ProfileReport(df, title=" Bank Customer Pandas Profiling Report")
profile.to_file("Bank Customer Pandas Profiling Report.html")




In [8]:
profile
Out[8]:

In [6]:
# Drop identifier columns (RowNumber, CustomerId, Surname): as explored above, they carry no predictive information
df1 = df.drop(["RowNumber", "CustomerId", "Surname"], axis = 1)
In [7]:
df1.head()
Out[7]:
CreditScore Geography Gender Age Tenure Balance NumOfProducts HasCrCard IsActiveMember EstimatedSalary Exited
0 619 France Female 42 2 0.00 1 1 1 101348.88 1
1 608 Spain Female 41 1 83807.86 1 0 1 112542.58 0
2 502 France Female 42 8 159660.80 3 1 0 113931.57 1
3 699 France Female 39 1 0.00 2 0 0 93826.63 0
4 850 Spain Female 43 2 125510.82 1 1 1 79084.10 0
In [8]:
# Understand the class balance of the target (Exited) across the customer base
sns.countplot(df1['Exited'])
Out[8]:
<matplotlib.axes._subplots.AxesSubplot at 0x297f0bf8280>
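The countplot above can also be quantified directly. A minimal sketch (using a small stand-in frame in place of the real `df1`):

```python
import pandas as pd

# Stand-in for df1: only the Exited column matters for the class balance.
df1 = pd.DataFrame({"Exited": [1, 0, 1, 0, 0]})

# normalize=True turns counts into fractions, i.e. the churn rate per class.
churn_rate = df1["Exited"].value_counts(normalize=True)
print(churn_rate)
```

In the real data roughly 20% of customers have Exited = 1 (see `df.describe()` above), so the classes are imbalanced.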
In [10]:
# We first review the relation of 'Exited' with the categorical variables
fig, axarr = plt.subplots(2, 2, figsize=(20, 12))
sns.countplot(x='Geography', hue = 'Exited',data = df1, ax=axarr[0][0])
sns.countplot(x='Gender', hue = 'Exited',data = df1, ax=axarr[0][1])
sns.countplot(x='HasCrCard', hue = 'Exited',data = df1, ax=axarr[1][0])
sns.countplot(x='IsActiveMember', hue = 'Exited',data = df1, ax=axarr[1][1])
Out[10]:
<matplotlib.axes._subplots.AxesSubplot at 0x297f6106070>

Observation: The proportion of female customers leaving is greater than that of male customers. With respect to geography, most customers are from France and Spain. The majority of the customers who leave hold credit cards, and inactive members churn at a noticeably higher rate.
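These proportions can be computed directly rather than read off the bar charts: the mean of a 0/1 target within each group is that group's churn rate. A minimal sketch with a stand-in frame (in the notebook, `df1` and any of the categorical columns would be used):

```python
import pandas as pd

# Stand-in data; real usage: df1.groupby("Geography")["Exited"].mean()
demo = pd.DataFrame({
    "Geography": ["France", "France", "Spain", "Germany", "Germany"],
    "Exited":    [0,        1,        0,       1,         1],
})

# Mean of the binary target per group = per-group churn rate.
rates = demo.groupby("Geography")["Exited"].mean()
print(rates.sort_values(ascending=False))
```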

In [11]:
# Select the feature columns as x and the target as y
x = df1[['CreditScore','Geography','Gender','Age','Tenure','Balance','NumOfProducts','HasCrCard','IsActiveMember',
        'EstimatedSalary']]
y = df1[['Exited']]
In [12]:
x.head()
Out[12]:
CreditScore Geography Gender Age Tenure Balance NumOfProducts HasCrCard IsActiveMember EstimatedSalary
0 619 France Female 42 2 0.00 1 1 1 101348.88
1 608 Spain Female 41 1 83807.86 1 0 1 112542.58
2 502 France Female 42 8 159660.80 3 1 0 113931.57
3 699 France Female 39 1 0.00 2 0 0 93826.63
4 850 Spain Female 43 2 125510.82 1 1 1 79084.10
In [13]:
y.head()
Out[13]:
Exited
0 1
1 0
2 1
3 0
4 0
In [14]:
y.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 10000 entries, 0 to 9999
Data columns (total 1 columns):
 #   Column  Non-Null Count  Dtype
---  ------  --------------  -----
 0   Exited  10000 non-null  int64
dtypes: int64(1)
memory usage: 78.2 KB
In [15]:
# Geography and Gender are object (string) columns
# encode them as integers before modeling

from sklearn.preprocessing import LabelEncoder
labelencoder_x = LabelEncoder()
x.iloc[:, 1] = labelencoder_x.fit_transform(x.iloc[:, 1]) #applying on Geography
In [16]:
labelencoder_x_2 = LabelEncoder()
x.iloc[:, 2] = labelencoder_x_2.fit_transform(x.iloc[:, 2]) #applying on Gender
In [17]:
x.head()
Out[17]:
CreditScore Geography Gender Age Tenure Balance NumOfProducts HasCrCard IsActiveMember EstimatedSalary
0 619 0 0 42 2 0.00 1 1 1 101348.88
1 608 2 0 41 1 83807.86 1 0 1 112542.58
2 502 0 0 42 8 159660.80 3 1 0 113931.57
3 699 0 0 39 1 0.00 2 0 0 93826.63
4 850 2 0 43 2 125510.82 1 1 1 79084.10
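One caveat with `LabelEncoder` on Geography: it maps France/Germany/Spain to 0/1/2, which implies an ordering the categories do not have. An alternative worth considering is one-hot encoding, sketched below with a small stand-in frame (column names produced by pandas are assumptions of this example):

```python
import pandas as pd

# Stand-in column; real usage would pass the Geography column of x.
demo = pd.DataFrame({"Geography": ["France", "Spain", "Germany"]})

# drop_first=True drops the alphabetically first category (France) to
# avoid a redundant column; the remaining indicators encode the rest.
onehot = pd.get_dummies(demo, columns=["Geography"], drop_first=True)
print(onehot.columns.tolist())
```

In practice a small network can often still learn from the integer codes, but one-hot encoding removes the artificial ordering.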
In [20]:
g=sns.pairplot(x,diag_kind="kde")
g.map_lower(sns.kdeplot, levels=4, color=".2")
Out[20]:
<seaborn.axisgrid.PairGrid at 0x27db6a7cb80>
In [18]:
plt.figure(figsize = (20,20))
sns.heatmap(x.corr(), annot=True,linewidths=0.2,annot_kws={'size':11})
Out[18]:
<matplotlib.axes._subplots.AxesSubplot at 0x297f57aa580>
In [19]:
# Split data into train and test sets
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn import preprocessing

X_train, X_test, y_train, y_test = train_test_split(x, y, test_size=0.20, random_state=1)
In [20]:
scaler = preprocessing.MinMaxScaler()
# MinMaxScaler is used here.
# Fit on the training set only, then apply the same transform to the test set (avoids data leakage).
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
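A quick way to see why the scaler is fitted on the training set only: its min and range come from the training data, so test values outside that range can fall outside [0, 1]. A minimal sketch with toy arrays:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Training data spans [0, 10]; the test set contains a value outside it.
train = np.array([[0.0], [10.0]])
test = np.array([[5.0], [20.0]])

scaler = MinMaxScaler()
train_scaled = scaler.fit_transform(train)
test_scaled = scaler.transform(test)  # 20.0 maps above 1 — expected, not a bug
print(test_scaled.ravel())
```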
In [24]:
import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout #to add layers
In [26]:
# Initialize the model, then add the input layer (by specifying input_shape) AND the first hidden layer (units)
classifier = Sequential()
classifier.add(Dense(6, activation = 'relu', input_shape = (X_train.shape[1], )))
classifier.add(Dropout(rate=0.1)) 
In [27]:
# Adding the second hidden layer
classifier.add(Dense(6, activation = 'relu')) 
classifier.add(Dropout(rate=0.1))
In [28]:
# Adding the output layer
# We use the sigmoid because we want probability outcomes
classifier.add(Dense(1, activation = 'sigmoid')) 
In [29]:
classifier.summary()
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense (Dense)                (None, 6)                 66        
_________________________________________________________________
dropout (Dropout)            (None, 6)                 0         
_________________________________________________________________
dense_1 (Dense)              (None, 6)                 42        
_________________________________________________________________
dropout_1 (Dropout)          (None, 6)                 0         
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 7         
=================================================================
Total params: 115
Trainable params: 115
Non-trainable params: 0
_________________________________________________________________
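The parameter counts in the summary can be checked by hand: each Dense layer has (inputs + 1 bias) × units weights.

```python
# Verify the Param # column of the model summary.
inputs = 10            # features after encoding
h1 = (inputs + 1) * 6  # first hidden layer -> 66
h2 = (6 + 1) * 6       # second hidden layer -> 42
out = (6 + 1) * 1      # output layer -> 7
print(h1, h2, out, h1 + h2 + out)  # 66 42 7 115
```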
In [34]:
# Compile the model: binary cross-entropy loss, minimized by backpropagation (gradient descent via Adam)
classifier.compile(optimizer = 'adam', loss = "binary_crossentropy", metrics = ['accuracy'])
In [35]:
# Fit the network on the training set, holding out 10% for validation
history = classifier.fit(X_train, y_train, batch_size=32, epochs=200, validation_split=0.1, verbose=2)
Epoch 1/200
225/225 - 1s - loss: 0.5502 - accuracy: 0.7568 - val_loss: 0.5257 - val_accuracy: 0.7788
Epoch 2/200
225/225 - 0s - loss: 0.5075 - accuracy: 0.7962 - val_loss: 0.5174 - val_accuracy: 0.7788
Epoch 3/200
225/225 - 0s - loss: 0.4961 - accuracy: 0.7983 - val_loss: 0.5117 - val_accuracy: 0.7788
Epoch 4/200
225/225 - 0s - loss: 0.4912 - accuracy: 0.7993 - val_loss: 0.5087 - val_accuracy: 0.7788
Epoch 5/200
225/225 - 0s - loss: 0.4868 - accuracy: 0.7993 - val_loss: 0.5031 - val_accuracy: 0.7788
Epoch 6/200
225/225 - 0s - loss: 0.4808 - accuracy: 0.7993 - val_loss: 0.4997 - val_accuracy: 0.7788
Epoch 7/200
225/225 - 0s - loss: 0.4771 - accuracy: 0.7993 - val_loss: 0.4952 - val_accuracy: 0.7788
Epoch 8/200
225/225 - 0s - loss: 0.4728 - accuracy: 0.7993 - val_loss: 0.4922 - val_accuracy: 0.7788
Epoch 9/200
225/225 - 0s - loss: 0.4690 - accuracy: 0.7997 - val_loss: 0.4882 - val_accuracy: 0.7812
Epoch 10/200
225/225 - 0s - loss: 0.4632 - accuracy: 0.8039 - val_loss: 0.4861 - val_accuracy: 0.7837
Epoch 11/200
225/225 - 0s - loss: 0.4583 - accuracy: 0.8056 - val_loss: 0.4803 - val_accuracy: 0.7900
Epoch 12/200
225/225 - 0s - loss: 0.4575 - accuracy: 0.8093 - val_loss: 0.4781 - val_accuracy: 0.7912
Epoch 13/200
225/225 - 0s - loss: 0.4515 - accuracy: 0.8101 - val_loss: 0.4772 - val_accuracy: 0.7925
Epoch 14/200
225/225 - 0s - loss: 0.4479 - accuracy: 0.8093 - val_loss: 0.4780 - val_accuracy: 0.7912
Epoch 15/200
225/225 - 0s - loss: 0.4465 - accuracy: 0.8114 - val_loss: 0.4740 - val_accuracy: 0.7925
Epoch 16/200
225/225 - 0s - loss: 0.4435 - accuracy: 0.8140 - val_loss: 0.4709 - val_accuracy: 0.8050
Epoch 17/200
225/225 - 0s - loss: 0.4452 - accuracy: 0.8126 - val_loss: 0.4691 - val_accuracy: 0.8037
Epoch 18/200
225/225 - 0s - loss: 0.4403 - accuracy: 0.8158 - val_loss: 0.4744 - val_accuracy: 0.7987
Epoch 19/200
225/225 - 0s - loss: 0.4359 - accuracy: 0.8171 - val_loss: 0.4743 - val_accuracy: 0.7962
Epoch 20/200
225/225 - 0s - loss: 0.4357 - accuracy: 0.8163 - val_loss: 0.4660 - val_accuracy: 0.8075
Epoch 21/200
225/225 - 0s - loss: 0.4362 - accuracy: 0.8203 - val_loss: 0.4653 - val_accuracy: 0.8075
Epoch 22/200
225/225 - 0s - loss: 0.4351 - accuracy: 0.8185 - val_loss: 0.4646 - val_accuracy: 0.8062
Epoch 23/200
225/225 - 0s - loss: 0.4350 - accuracy: 0.8190 - val_loss: 0.4695 - val_accuracy: 0.8062
Epoch 24/200
225/225 - 0s - loss: 0.4341 - accuracy: 0.8201 - val_loss: 0.4652 - val_accuracy: 0.8075
Epoch 25/200
225/225 - 0s - loss: 0.4318 - accuracy: 0.8219 - val_loss: 0.4643 - val_accuracy: 0.8087
Epoch 26/200
225/225 - 0s - loss: 0.4259 - accuracy: 0.8268 - val_loss: 0.4650 - val_accuracy: 0.8075
Epoch 27/200
225/225 - 0s - loss: 0.4249 - accuracy: 0.8238 - val_loss: 0.4644 - val_accuracy: 0.8087
Epoch 28/200
225/225 - 0s - loss: 0.4306 - accuracy: 0.8239 - val_loss: 0.4639 - val_accuracy: 0.8075
Epoch 29/200
225/225 - 0s - loss: 0.4233 - accuracy: 0.8250 - val_loss: 0.4645 - val_accuracy: 0.8100
Epoch 30/200
225/225 - 0s - loss: 0.4221 - accuracy: 0.8243 - val_loss: 0.4642 - val_accuracy: 0.8075
Epoch 31/200
225/225 - 0s - loss: 0.4285 - accuracy: 0.8246 - val_loss: 0.4604 - val_accuracy: 0.8075
Epoch 32/200
225/225 - 0s - loss: 0.4250 - accuracy: 0.8246 - val_loss: 0.4607 - val_accuracy: 0.8100
Epoch 33/200
225/225 - 0s - loss: 0.4238 - accuracy: 0.8246 - val_loss: 0.4615 - val_accuracy: 0.8125
Epoch 34/200
225/225 - 0s - loss: 0.4214 - accuracy: 0.8264 - val_loss: 0.4611 - val_accuracy: 0.8062
Epoch 35/200
225/225 - 0s - loss: 0.4224 - accuracy: 0.8271 - val_loss: 0.4587 - val_accuracy: 0.8138
Epoch 36/200
225/225 - 0s - loss: 0.4213 - accuracy: 0.8254 - val_loss: 0.4605 - val_accuracy: 0.8100
Epoch 37/200
225/225 - 0s - loss: 0.4196 - accuracy: 0.8253 - val_loss: 0.4594 - val_accuracy: 0.8125
Epoch 38/200
225/225 - 0s - loss: 0.4169 - accuracy: 0.8299 - val_loss: 0.4605 - val_accuracy: 0.8163
Epoch 39/200
225/225 - 0s - loss: 0.4177 - accuracy: 0.8272 - val_loss: 0.4597 - val_accuracy: 0.8138
Epoch 40/200
225/225 - 0s - loss: 0.4192 - accuracy: 0.8303 - val_loss: 0.4617 - val_accuracy: 0.8100
Epoch 41/200
225/225 - 0s - loss: 0.4158 - accuracy: 0.8306 - val_loss: 0.4583 - val_accuracy: 0.8112
Epoch 42/200
225/225 - 0s - loss: 0.4175 - accuracy: 0.8297 - val_loss: 0.4594 - val_accuracy: 0.8112
Epoch 43/200
225/225 - 0s - loss: 0.4202 - accuracy: 0.8315 - val_loss: 0.4584 - val_accuracy: 0.8125
Epoch 44/200
225/225 - 0s - loss: 0.4154 - accuracy: 0.8335 - val_loss: 0.4576 - val_accuracy: 0.8112
Epoch 45/200
225/225 - 0s - loss: 0.4178 - accuracy: 0.8296 - val_loss: 0.4587 - val_accuracy: 0.8112
Epoch 46/200
225/225 - 0s - loss: 0.4145 - accuracy: 0.8314 - val_loss: 0.4584 - val_accuracy: 0.8075
Epoch 47/200
225/225 - 0s - loss: 0.4156 - accuracy: 0.8340 - val_loss: 0.4581 - val_accuracy: 0.8062
Epoch 48/200
225/225 - 0s - loss: 0.4160 - accuracy: 0.8322 - val_loss: 0.4588 - val_accuracy: 0.8087
Epoch 49/200
225/225 - 0s - loss: 0.4131 - accuracy: 0.8294 - val_loss: 0.4600 - val_accuracy: 0.8125
Epoch 50/200
225/225 - 0s - loss: 0.4146 - accuracy: 0.8307 - val_loss: 0.4578 - val_accuracy: 0.8100
Epoch 51/200
225/225 - 0s - loss: 0.4136 - accuracy: 0.8331 - val_loss: 0.4571 - val_accuracy: 0.8075
Epoch 52/200
225/225 - 0s - loss: 0.4112 - accuracy: 0.8347 - val_loss: 0.4605 - val_accuracy: 0.8087
Epoch 53/200
225/225 - 0s - loss: 0.4121 - accuracy: 0.8340 - val_loss: 0.4588 - val_accuracy: 0.8100
Epoch 54/200
225/225 - 0s - loss: 0.4118 - accuracy: 0.8342 - val_loss: 0.4580 - val_accuracy: 0.8037
Epoch 55/200
225/225 - 0s - loss: 0.4138 - accuracy: 0.8356 - val_loss: 0.4589 - val_accuracy: 0.8100
Epoch 56/200
225/225 - 0s - loss: 0.4112 - accuracy: 0.8331 - val_loss: 0.4595 - val_accuracy: 0.8062
Epoch 57/200
225/225 - 0s - loss: 0.4135 - accuracy: 0.8318 - val_loss: 0.4590 - val_accuracy: 0.8075
Epoch 58/200
225/225 - 0s - loss: 0.4145 - accuracy: 0.8332 - val_loss: 0.4587 - val_accuracy: 0.8125
Epoch 59/200
225/225 - 0s - loss: 0.4127 - accuracy: 0.8340 - val_loss: 0.4582 - val_accuracy: 0.8100
Epoch 60/200
225/225 - 0s - loss: 0.4113 - accuracy: 0.8332 - val_loss: 0.4643 - val_accuracy: 0.8062
Epoch 61/200
225/225 - 0s - loss: 0.4104 - accuracy: 0.8353 - val_loss: 0.4587 - val_accuracy: 0.8087
Epoch 62/200
225/225 - 0s - loss: 0.4098 - accuracy: 0.8329 - val_loss: 0.4582 - val_accuracy: 0.8075
Epoch 63/200
225/225 - 0s - loss: 0.4108 - accuracy: 0.8326 - val_loss: 0.4567 - val_accuracy: 0.8112
Epoch 64/200
225/225 - 0s - loss: 0.4105 - accuracy: 0.8347 - val_loss: 0.4564 - val_accuracy: 0.8075
Epoch 65/200
225/225 - 0s - loss: 0.4091 - accuracy: 0.8354 - val_loss: 0.4590 - val_accuracy: 0.8062
Epoch 66/200
225/225 - 0s - loss: 0.4113 - accuracy: 0.8336 - val_loss: 0.4538 - val_accuracy: 0.8062
Epoch 67/200
225/225 - 0s - loss: 0.4115 - accuracy: 0.8317 - val_loss: 0.4599 - val_accuracy: 0.8087
Epoch 68/200
225/225 - 0s - loss: 0.4095 - accuracy: 0.8342 - val_loss: 0.4541 - val_accuracy: 0.8075
Epoch 69/200
225/225 - 0s - loss: 0.4085 - accuracy: 0.8342 - val_loss: 0.4574 - val_accuracy: 0.8037
Epoch 70/200
225/225 - 0s - loss: 0.4109 - accuracy: 0.8351 - val_loss: 0.4545 - val_accuracy: 0.8062
Epoch 71/200
225/225 - 0s - loss: 0.4090 - accuracy: 0.8342 - val_loss: 0.4576 - val_accuracy: 0.8062
Epoch 72/200
225/225 - 0s - loss: 0.4093 - accuracy: 0.8339 - val_loss: 0.4589 - val_accuracy: 0.8075
Epoch 73/200
225/225 - 0s - loss: 0.4112 - accuracy: 0.8357 - val_loss: 0.4621 - val_accuracy: 0.8100
Epoch 74/200
225/225 - 0s - loss: 0.4101 - accuracy: 0.8344 - val_loss: 0.4619 - val_accuracy: 0.8025
Epoch 75/200
225/225 - 0s - loss: 0.4101 - accuracy: 0.8322 - val_loss: 0.4598 - val_accuracy: 0.8075
Epoch 76/200
225/225 - 0s - loss: 0.4089 - accuracy: 0.8356 - val_loss: 0.4542 - val_accuracy: 0.8075
Epoch 77/200
225/225 - 0s - loss: 0.4095 - accuracy: 0.8318 - val_loss: 0.4533 - val_accuracy: 0.8062
Epoch 78/200
225/225 - 0s - loss: 0.4095 - accuracy: 0.8331 - val_loss: 0.4561 - val_accuracy: 0.8100
Epoch 79/200
225/225 - 0s - loss: 0.4087 - accuracy: 0.8353 - val_loss: 0.4627 - val_accuracy: 0.8062
Epoch 80/200
225/225 - 0s - loss: 0.4083 - accuracy: 0.8335 - val_loss: 0.4541 - val_accuracy: 0.8050
Epoch 81/200
225/225 - 0s - loss: 0.4087 - accuracy: 0.8347 - val_loss: 0.4592 - val_accuracy: 0.8075
Epoch 82/200
225/225 - 0s - loss: 0.4079 - accuracy: 0.8365 - val_loss: 0.4568 - val_accuracy: 0.8050
Epoch 83/200
225/225 - 0s - loss: 0.4084 - accuracy: 0.8369 - val_loss: 0.4504 - val_accuracy: 0.8087
Epoch 84/200
225/225 - 0s - loss: 0.4104 - accuracy: 0.8336 - val_loss: 0.4576 - val_accuracy: 0.8075
Epoch 85/200
225/225 - 0s - loss: 0.4066 - accuracy: 0.8378 - val_loss: 0.4549 - val_accuracy: 0.8100
Epoch 86/200
225/225 - 0s - loss: 0.4098 - accuracy: 0.8353 - val_loss: 0.4548 - val_accuracy: 0.8112
Epoch 87/200
225/225 - 0s - loss: 0.4062 - accuracy: 0.8360 - val_loss: 0.4508 - val_accuracy: 0.8100
Epoch 88/200
225/225 - 0s - loss: 0.4057 - accuracy: 0.8386 - val_loss: 0.4519 - val_accuracy: 0.8112
Epoch 89/200
225/225 - 0s - loss: 0.4067 - accuracy: 0.8367 - val_loss: 0.4482 - val_accuracy: 0.8125
Epoch 90/200
225/225 - 0s - loss: 0.4085 - accuracy: 0.8354 - val_loss: 0.4518 - val_accuracy: 0.8112
Epoch 91/200
225/225 - 0s - loss: 0.4045 - accuracy: 0.8407 - val_loss: 0.4592 - val_accuracy: 0.8112
Epoch 92/200
225/225 - 0s - loss: 0.4017 - accuracy: 0.8399 - val_loss: 0.4527 - val_accuracy: 0.8112
Epoch 93/200
225/225 - 0s - loss: 0.4051 - accuracy: 0.8379 - val_loss: 0.4511 - val_accuracy: 0.8087
Epoch 94/200
225/225 - 0s - loss: 0.4015 - accuracy: 0.8414 - val_loss: 0.4532 - val_accuracy: 0.8150
Epoch 95/200
225/225 - 0s - loss: 0.4046 - accuracy: 0.8390 - val_loss: 0.4464 - val_accuracy: 0.8163
Epoch 96/200
225/225 - 0s - loss: 0.4050 - accuracy: 0.8379 - val_loss: 0.4488 - val_accuracy: 0.8125
Epoch 97/200
225/225 - 0s - loss: 0.4032 - accuracy: 0.8390 - val_loss: 0.4511 - val_accuracy: 0.8138
Epoch 98/200
225/225 - 0s - loss: 0.4022 - accuracy: 0.8404 - val_loss: 0.4480 - val_accuracy: 0.8112
Epoch 99/200
225/225 - 0s - loss: 0.4037 - accuracy: 0.8388 - val_loss: 0.4442 - val_accuracy: 0.8175
Epoch 100/200
225/225 - 0s - loss: 0.4040 - accuracy: 0.8393 - val_loss: 0.4514 - val_accuracy: 0.8087
Epoch 101/200
225/225 - 0s - loss: 0.4022 - accuracy: 0.8408 - val_loss: 0.4543 - val_accuracy: 0.8138
Epoch 102/200
225/225 - 0s - loss: 0.4005 - accuracy: 0.8397 - val_loss: 0.4508 - val_accuracy: 0.8175
Epoch 103/200
225/225 - 0s - loss: 0.4013 - accuracy: 0.8393 - val_loss: 0.4473 - val_accuracy: 0.8188
Epoch 104/200
225/225 - 0s - loss: 0.4025 - accuracy: 0.8364 - val_loss: 0.4535 - val_accuracy: 0.8150
Epoch 105/200
225/225 - 0s - loss: 0.4026 - accuracy: 0.8401 - val_loss: 0.4449 - val_accuracy: 0.8163
Epoch 106/200
225/225 - 0s - loss: 0.3987 - accuracy: 0.8419 - val_loss: 0.4495 - val_accuracy: 0.8138
Epoch 107/200
225/225 - 0s - loss: 0.3983 - accuracy: 0.8421 - val_loss: 0.4445 - val_accuracy: 0.8200
Epoch 108/200
225/225 - 0s - loss: 0.3992 - accuracy: 0.8415 - val_loss: 0.4501 - val_accuracy: 0.8087
Epoch 109/200
225/225 - 0s - loss: 0.4017 - accuracy: 0.8382 - val_loss: 0.4470 - val_accuracy: 0.8138
Epoch 110/200
225/225 - 0s - loss: 0.3982 - accuracy: 0.8435 - val_loss: 0.4492 - val_accuracy: 0.8112
Epoch 111/200
225/225 - 0s - loss: 0.4016 - accuracy: 0.8415 - val_loss: 0.4464 - val_accuracy: 0.8138
Epoch 112/200
225/225 - 0s - loss: 0.3999 - accuracy: 0.8424 - val_loss: 0.4438 - val_accuracy: 0.8175
Epoch 113/200
225/225 - 0s - loss: 0.4008 - accuracy: 0.8393 - val_loss: 0.4442 - val_accuracy: 0.8175
Epoch 114/200
225/225 - 0s - loss: 0.3976 - accuracy: 0.8447 - val_loss: 0.4409 - val_accuracy: 0.8150
Epoch 115/200
225/225 - 0s - loss: 0.4008 - accuracy: 0.8414 - val_loss: 0.4421 - val_accuracy: 0.8163
Epoch 116/200
225/225 - 0s - loss: 0.4001 - accuracy: 0.8386 - val_loss: 0.4482 - val_accuracy: 0.8188
Epoch 117/200
225/225 - 0s - loss: 0.4020 - accuracy: 0.8400 - val_loss: 0.4469 - val_accuracy: 0.8150
Epoch 118/200
225/225 - 0s - loss: 0.4007 - accuracy: 0.8407 - val_loss: 0.4445 - val_accuracy: 0.8188
Epoch 119/200
225/225 - 0s - loss: 0.4019 - accuracy: 0.8390 - val_loss: 0.4408 - val_accuracy: 0.8163
Epoch 120/200
225/225 - 0s - loss: 0.3991 - accuracy: 0.8431 - val_loss: 0.4449 - val_accuracy: 0.8213
Epoch 121/200
225/225 - 0s - loss: 0.3993 - accuracy: 0.8407 - val_loss: 0.4467 - val_accuracy: 0.8213
Epoch 122/200
225/225 - 0s - loss: 0.4001 - accuracy: 0.8385 - val_loss: 0.4462 - val_accuracy: 0.8188
Epoch 123/200
225/225 - 0s - loss: 0.3984 - accuracy: 0.8396 - val_loss: 0.4460 - val_accuracy: 0.8175
Epoch 124/200
225/225 - 0s - loss: 0.4001 - accuracy: 0.8383 - val_loss: 0.4470 - val_accuracy: 0.8200
Epoch 125/200
225/225 - 0s - loss: 0.3971 - accuracy: 0.8401 - val_loss: 0.4449 - val_accuracy: 0.8200
Epoch 126/200
225/225 - 0s - loss: 0.3987 - accuracy: 0.8399 - val_loss: 0.4484 - val_accuracy: 0.8188
Epoch 127/200
225/225 - 0s - loss: 0.3988 - accuracy: 0.8414 - val_loss: 0.4464 - val_accuracy: 0.8150
Epoch 128/200
225/225 - 0s - loss: 0.3998 - accuracy: 0.8353 - val_loss: 0.4417 - val_accuracy: 0.8200
Epoch 129/200
225/225 - 0s - loss: 0.4009 - accuracy: 0.8383 - val_loss: 0.4497 - val_accuracy: 0.8112
Epoch 130/200
225/225 - 0s - loss: 0.3972 - accuracy: 0.8410 - val_loss: 0.4395 - val_accuracy: 0.8213
Epoch 131/200
225/225 - 0s - loss: 0.3971 - accuracy: 0.8411 - val_loss: 0.4519 - val_accuracy: 0.8125
Epoch 132/200
225/225 - 0s - loss: 0.4005 - accuracy: 0.8396 - val_loss: 0.4474 - val_accuracy: 0.8200
Epoch 133/200
225/225 - 0s - loss: 0.3974 - accuracy: 0.8381 - val_loss: 0.4442 - val_accuracy: 0.8213
Epoch 134/200
225/225 - 0s - loss: 0.4009 - accuracy: 0.8393 - val_loss: 0.4444 - val_accuracy: 0.8225
Epoch 135/200
225/225 - 0s - loss: 0.3957 - accuracy: 0.8415 - val_loss: 0.4431 - val_accuracy: 0.8188
Epoch 136/200
225/225 - 0s - loss: 0.3973 - accuracy: 0.8417 - val_loss: 0.4473 - val_accuracy: 0.8163
Epoch 137/200
225/225 - 0s - loss: 0.3958 - accuracy: 0.8444 - val_loss: 0.4440 - val_accuracy: 0.8163
Epoch 138/200
225/225 - 0s - loss: 0.3941 - accuracy: 0.8432 - val_loss: 0.4488 - val_accuracy: 0.8125
Epoch 139/200
225/225 - 0s - loss: 0.3956 - accuracy: 0.8413 - val_loss: 0.4434 - val_accuracy: 0.8200
Epoch 140/200
225/225 - 0s - loss: 0.3955 - accuracy: 0.8417 - val_loss: 0.4482 - val_accuracy: 0.8175
Epoch 141/200
225/225 - 0s - loss: 0.3978 - accuracy: 0.8407 - val_loss: 0.4456 - val_accuracy: 0.8188
Epoch 142/200
225/225 - 0s - loss: 0.3991 - accuracy: 0.8383 - val_loss: 0.4452 - val_accuracy: 0.8150
Epoch 143/200
225/225 - 0s - loss: 0.3989 - accuracy: 0.8396 - val_loss: 0.4412 - val_accuracy: 0.8138
Epoch 144/200
225/225 - 0s - loss: 0.3990 - accuracy: 0.8379 - val_loss: 0.4390 - val_accuracy: 0.8225
Epoch 145/200
225/225 - 0s - loss: 0.4006 - accuracy: 0.8400 - val_loss: 0.4417 - val_accuracy: 0.8200
Epoch 146/200
225/225 - 0s - loss: 0.3991 - accuracy: 0.8413 - val_loss: 0.4426 - val_accuracy: 0.8213
Epoch 147/200
225/225 - 0s - loss: 0.3960 - accuracy: 0.8392 - val_loss: 0.4437 - val_accuracy: 0.8213
Epoch 148/200
225/225 - 0s - loss: 0.4005 - accuracy: 0.8392 - val_loss: 0.4446 - val_accuracy: 0.8188
Epoch 149/200
225/225 - 0s - loss: 0.3971 - accuracy: 0.8429 - val_loss: 0.4521 - val_accuracy: 0.8112
Epoch 150/200
225/225 - 0s - loss: 0.3957 - accuracy: 0.8417 - val_loss: 0.4426 - val_accuracy: 0.8175
Epoch 151/200
225/225 - 0s - loss: 0.3979 - accuracy: 0.8390 - val_loss: 0.4384 - val_accuracy: 0.8213
Epoch 152/200
225/225 - 0s - loss: 0.3970 - accuracy: 0.8418 - val_loss: 0.4410 - val_accuracy: 0.8213
Epoch 153/200
225/225 - 0s - loss: 0.3981 - accuracy: 0.8414 - val_loss: 0.4386 - val_accuracy: 0.8200
Epoch 154/200
225/225 - 0s - loss: 0.3960 - accuracy: 0.8419 - val_loss: 0.4424 - val_accuracy: 0.8163
Epoch 155/200
225/225 - 0s - loss: 0.3960 - accuracy: 0.8399 - val_loss: 0.4410 - val_accuracy: 0.8175
Epoch 156/200
225/225 - 0s - loss: 0.3959 - accuracy: 0.8414 - val_loss: 0.4435 - val_accuracy: 0.8213
Epoch 157/200
225/225 - 0s - loss: 0.3945 - accuracy: 0.8425 - val_loss: 0.4418 - val_accuracy: 0.8200
Epoch 158/200
225/225 - 0s - loss: 0.3992 - accuracy: 0.8392 - val_loss: 0.4414 - val_accuracy: 0.8225
Epoch 159/200
225/225 - 0s - loss: 0.3981 - accuracy: 0.8401 - val_loss: 0.4398 - val_accuracy: 0.8238
Epoch 160/200
225/225 - 0s - loss: 0.3932 - accuracy: 0.8417 - val_loss: 0.4363 - val_accuracy: 0.8263
Epoch 161/200
225/225 - 0s - loss: 0.3916 - accuracy: 0.8433 - val_loss: 0.4337 - val_accuracy: 0.8288
Epoch 162/200
225/225 - 0s - loss: 0.3925 - accuracy: 0.8426 - val_loss: 0.4360 - val_accuracy: 0.8263
Epoch 163/200
225/225 - 0s - loss: 0.3936 - accuracy: 0.8418 - val_loss: 0.4322 - val_accuracy: 0.8263
Epoch 164/200
225/225 - 0s - loss: 0.3932 - accuracy: 0.8431 - val_loss: 0.4373 - val_accuracy: 0.8225
Epoch 165/200
225/225 - 0s - loss: 0.3937 - accuracy: 0.8425 - val_loss: 0.4349 - val_accuracy: 0.8275
Epoch 166/200
225/225 - 0s - loss: 0.3880 - accuracy: 0.8474 - val_loss: 0.4365 - val_accuracy: 0.8238
Epoch 167/200
225/225 - 0s - loss: 0.3930 - accuracy: 0.8400 - val_loss: 0.4379 - val_accuracy: 0.8250
Epoch 168/200
225/225 - 0s - loss: 0.3923 - accuracy: 0.8433 - val_loss: 0.4305 - val_accuracy: 0.8275
Epoch 169/200
225/225 - 0s - loss: 0.3927 - accuracy: 0.8432 - val_loss: 0.4413 - val_accuracy: 0.8175
Epoch 170/200
225/225 - 0s - loss: 0.3920 - accuracy: 0.8453 - val_loss: 0.4334 - val_accuracy: 0.8250
Epoch 171/200
225/225 - 0s - loss: 0.3920 - accuracy: 0.8425 - val_loss: 0.4344 - val_accuracy: 0.8250
Epoch 172/200
225/225 - 0s - loss: 0.3887 - accuracy: 0.8465 - val_loss: 0.4327 - val_accuracy: 0.8263
Epoch 173/200
225/225 - 0s - loss: 0.3901 - accuracy: 0.8464 - val_loss: 0.4310 - val_accuracy: 0.8263
Epoch 174/200
225/225 - 0s - loss: 0.3898 - accuracy: 0.8457 - val_loss: 0.4314 - val_accuracy: 0.8263
Epoch 175/200
225/225 - 0s - loss: 0.3907 - accuracy: 0.8446 - val_loss: 0.4267 - val_accuracy: 0.8300
Epoch 176/200
225/225 - 0s - loss: 0.3882 - accuracy: 0.8465 - val_loss: 0.4341 - val_accuracy: 0.8225
Epoch 177/200
225/225 - 0s - loss: 0.3891 - accuracy: 0.8443 - val_loss: 0.4301 - val_accuracy: 0.8300
Epoch 178/200
225/225 - 0s - loss: 0.3865 - accuracy: 0.8454 - val_loss: 0.4335 - val_accuracy: 0.8188
Epoch 179/200
225/225 - 0s - loss: 0.3911 - accuracy: 0.8432 - val_loss: 0.4340 - val_accuracy: 0.8225
Epoch 180/200
225/225 - 0s - loss: 0.3907 - accuracy: 0.8422 - val_loss: 0.4286 - val_accuracy: 0.8288
Epoch 181/200
225/225 - 0s - loss: 0.3850 - accuracy: 0.8465 - val_loss: 0.4272 - val_accuracy: 0.8313
Epoch 182/200
225/225 - 0s - loss: 0.3887 - accuracy: 0.8426 - val_loss: 0.4277 - val_accuracy: 0.8313
Epoch 183/200
225/225 - 0s - loss: 0.3896 - accuracy: 0.8444 - val_loss: 0.4251 - val_accuracy: 0.8263
Epoch 184/200
225/225 - 0s - loss: 0.3872 - accuracy: 0.8464 - val_loss: 0.4263 - val_accuracy: 0.8263
Epoch 185/200
225/225 - 0s - loss: 0.3881 - accuracy: 0.8418 - val_loss: 0.4223 - val_accuracy: 0.8275
Epoch 186/200
225/225 - 0s - loss: 0.3892 - accuracy: 0.8435 - val_loss: 0.4232 - val_accuracy: 0.8300
Epoch 187/200
225/225 - 0s - loss: 0.3834 - accuracy: 0.8465 - val_loss: 0.4215 - val_accuracy: 0.8313
Epoch 188/200
225/225 - 0s - loss: 0.3866 - accuracy: 0.8471 - val_loss: 0.4234 - val_accuracy: 0.8263
Epoch 189/200
225/225 - 0s - loss: 0.3897 - accuracy: 0.8442 - val_loss: 0.4209 - val_accuracy: 0.8275
Epoch 190/200
225/225 - 0s - loss: 0.3841 - accuracy: 0.8464 - val_loss: 0.4267 - val_accuracy: 0.8275
Epoch 191/200
225/225 - 0s - loss: 0.3855 - accuracy: 0.8462 - val_loss: 0.4214 - val_accuracy: 0.8300
Epoch 192/200
225/225 - 0s - loss: 0.3840 - accuracy: 0.8476 - val_loss: 0.4277 - val_accuracy: 0.8238
Epoch 193/200
225/225 - 0s - loss: 0.3847 - accuracy: 0.8468 - val_loss: 0.4183 - val_accuracy: 0.8275
Epoch 194/200
225/225 - 0s - loss: 0.3865 - accuracy: 0.8460 - val_loss: 0.4266 - val_accuracy: 0.8238
Epoch 195/200
225/225 - 0s - loss: 0.3878 - accuracy: 0.8449 - val_loss: 0.4191 - val_accuracy: 0.8288
Epoch 196/200
225/225 - 0s - loss: 0.3794 - accuracy: 0.8504 - val_loss: 0.4190 - val_accuracy: 0.8263
Epoch 197/200
225/225 - 0s - loss: 0.3834 - accuracy: 0.8483 - val_loss: 0.4216 - val_accuracy: 0.8300
Epoch 198/200
225/225 - 0s - loss: 0.3839 - accuracy: 0.8482 - val_loss: 0.4192 - val_accuracy: 0.8300
Epoch 199/200
225/225 - 0s - loss: 0.3833 - accuracy: 0.8468 - val_loss: 0.4190 - val_accuracy: 0.8275
Epoch 200/200
225/225 - 0s - loss: 0.3791 - accuracy: 0.8499 - val_loss: 0.4160 - val_accuracy: 0.8313
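Rather than reading 200 epochs of logs, the `history` object returned by `fit()` can be plotted. Its `history` attribute is a dict of per-epoch lists; a stand-in dict is used in this sketch so it runs without re-training:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for the sketch
import matplotlib.pyplot as plt

# Stand-in for history.history; in the notebook, use the real object.
hist = {
    "loss":     [0.55, 0.50, 0.45, 0.42],
    "val_loss": [0.53, 0.51, 0.48, 0.46],
}

fig, ax = plt.subplots()
ax.plot(hist["loss"], label="train loss")
ax.plot(hist["val_loss"], label="val loss")
ax.set_xlabel("epoch")
ax.set_ylabel("binary cross-entropy")
ax.legend()
fig.savefig("loss_curves.png")
```

In the log above, train and validation loss decline together through epoch 200, so there is no strong sign of overfitting at this size.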
In [38]:
#predicting the results
y_pred = classifier.predict(X_test)
y_pred = (y_pred > 0.5) #to classify each probability into True or False
In [41]:
# Representation via confusion matrix
from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_test, y_pred)
print(cm)
print('< ---- Confusion Matrix ---- >')
sns.heatmap(cm, annot=True ,fmt='g')
[[1557   28]
 [ 261  154]]
< ---- Confusion Matrix ---- >
Out[41]:
<matplotlib.axes._subplots.AxesSubplot at 0x297f74fd070>
In [42]:
print (((cm[0][0]+cm[1][1])*100)/(len(y_test)), '% of testing data was classified correctly')
85.55 % of testing data was classified correctly
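Accuracy alone can flatter an imbalanced problem: the confusion matrix above shows recall on churners is only 154/415 ≈ 37%. Per-class metrics make this visible; a minimal sketch with stand-in labels (in the notebook, pass `y_test` and `y_pred`):

```python
from sklearn.metrics import classification_report, recall_score

# Stand-in labels: the classifier misses 3 of the 4 positives.
y_true = [0, 0, 0, 0, 1, 1, 1, 1]
y_hat  = [0, 0, 0, 0, 1, 0, 0, 0]

print(classification_report(y_true, y_hat))
print("churn recall:", recall_score(y_true, y_hat))
```

Class weighting, threshold tuning, or resampling are common next steps when churner recall matters more than overall accuracy.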